Technology Tales

Adventures & experiences in contemporary technology

A Look at a Compact System Camera

4th September 2013

During August, I acquired an Olympus Pen E-PL5, an item to which I am still becoming accustomed, and it looks as if that is set to continue. The main reason that it appealed to me was the idea of having a camera with much of the functionality of an SLR but with many of the dimensions of a compact camera. In that way, it was a step up from my Canon PowerShot G11 without my having to carry around something too bulky.

Olympus Pen E-PL5

Before I settled on the E-PL5, I had been looking at Canon’s EOS M and got to hear about its sluggish autofocus. That it had no mode dial on its top plate was another consideration, though it does pack in an APS-C sized sensor (with Canon’s tendency to overexpose finding a little favour with me too on inspection of images from a well-aged Canon EOS 10D) at a not so unappealing price of around £399. A sighting of a group test of it and similar cameras in Practical Photography was enough to land that particular issue in my possession, and the magazine liked the similarly priced Olympus Pen E-PM2 more than the Canon. Though it was a Panasonic that won top honours in that test, I was intrigued enough by the Olympus option to take a further look. Unlike the E-PM2 and the EOS M, the E-PL5 does have a mode dial on its top plate and an extra grip, so that got my vote even if it meant paying a little extra. Olympus Pen models had attracted my attention before now due to sale prices, but this investment goes beyond that opportunism.

The E-PL5 comes in three colours: black, silver and white. Though I have a tendency to go for black when buying cameras, it was the silver option that took my fancy this time around for the sake of a spot of variety. The body itself is a very compact affair, so it is the lens that takes up most of the bulk. The standard 14-42 mm zoom ensures that this is not a camera for a shirt pocket, and I got a black Lowepro Apex 100 AW case for it; the case fits snugly around the camera, so much so that I was left wondering if I should have gone for a bigger one, but it has been working out fine anyway. The other accessory that I added was a 37 mm Hoya HMC UV filter, so that the lens doesn’t get too knocked about while I have the camera with me on an outing of one sort or another, especially since its plastic construction protrudes a lot further than I was expecting and doesn’t retract fully into its housing like some Sigma lenses that I use.

When I first gave the camera a test run, I had to work out how best to hold it. After all, the powered zoom and autofocus on my Canon PowerShot G11 made that camera more intuitive to hold, and it has been similar for any SLR that I have used. Having to work a zoom lens while holding a dinky body was fiddly at first, until I worked out how to use my right thumb to keep the body steady (the thumb grip on the back of the camera is curved to hold a thumb in a vertical position) while the left hand adjusted the lens freely. Having an electronic viewfinder instead of using the screen would have made life a little easier, but they are not cheap and I had already spent enough money.

The next task after working out how to hold the camera was to acclimatise myself to its exposure characteristics. In my experience so far, it appears to err on the side of overexposure. Because I had set it to store images as raw (ORF) files, this could be sorted later, but I prefer to have a greater sense of control at the photo capture stage. Until now, I have not found a spot or partial metering button like the one I would have on an SLR or my G11. That has meant either using exposure compensation to go along with my preferred choice of aperture priority mode or going with fully manual exposure. Other modes are available and they should be familiar to any SLR user (shutter priority, program, automatic, etc.). Currently, I am using bracketing while finding my feet, after setting the ISO to 400, increasing the brightness of the screen and adding histograms to the playback views. With my hold on the camera growing more secure, I have been using the dial to change exposure settings such as aperture (f/16 remains a favourite of mine in spite of what others may think, given the size of a Micro Four Thirds sensor) and compensation while keeping the scene exactly the same, to test what the response to any changes might be.

While I am still finding my feet, I am seeing some pleasing results so far that encourage me to keep going; some remind me of my Pentax K10D. The E-PL5 certainly is slower to use than the G11, but that often can be a good thing when it comes to photography. That it forces a little relaxation in this often hectic world is another advantage. The G11 is having a quieter time at the moment, and any episodes of sunshine offer useful opportunities for further experimentation and acclimatisation too. So far, my entry into the world of compact system cameras has revealed them to be of a very different form to compact fixed-lens cameras or SLRs. Neither truly gets replaced; another type of camera has emerged.

Command line mapping of network drives

5th September 2007

Mapping network drives in Windows usually involves shuffling through Explorer menus. There is another way that I consider to be neater: using the Windows command line ("DOS" to some). The basic command for creating a mapping goes like this:

net use w: \\yourserver.address

To ensure persistence of the mapping across different Windows sessions, use this:

net use w: \\yourserver.address /persistent:yes

Here’s how to set up a mapping that logs in as a different user:

net use w: \\yourserver.address password /user:you

The above can include domain information as well and in a number of different forms: domain\username is one.
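
For example, a login with a domain-qualified account might look like the following; the names here are hypothetical:

net use w: \\yourserver.address password /user:yourdomain\you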

To delete a mapping, try this:

net use w: /delete

List all existing mappings:

net use

This is a flavour of what is available, and Microsoft does provide documentation. Issuing the following command will bring up some of that on the command line:

net help use

A web development toolbox

23rd March 2007

Having been on a web-building journey from Geocities to having a website with my own domain hosted by Fasthosts, it should come as no surprise that I have encountered a number of tools and technologies over this time and that my choices and knowledge have evolved too. I’ll muse over the technologies first before going on to the tools that I use.

Technologies

XHTML

When I started building websites, HTML 4 was not long in existence and I devoured most if not all of Elizabeth Castro’s Peachpit Visual QuickStart guide to the language within a weekend. Having previously used fairly primitive WYSIWYG tools like Netscape Composer and Claris Home Page, I found it an empowering experience, and the first edition (it is now on its third) of Jennifer Niederst Robbins’ Web Design in a Nutshell took things much further, becoming something of a bible for a number of years.

When it first appeared, XHTML 1.0 wasn’t a major change from HTML 4, but its stricter, more XML-compliant syntax was meant to point the way to the future, and semantic markup was at its heart at least as much as it was for HTML 4. XHTML 2.0 is on the horizon and, after the modular approach of XHTML 1.1 (which I have never used), it will be interesting to see how it develops. Nevertheless, there is a surprising development in that some people are musing over the idea of having an HTML 5. Let’s hope that the (X)HTML apple cart doesn’t get completely overturned after some years of relative stability. I still bear scars from the browser wars raging in the 1990s and don’t want to see standards wars supplanting the relative peace that we have now. That said, I don’t mind peaceful progression.

CSS

CSS only seems to have come into its own in the last few years, and it is truly an amazing technology in spite of the hobbles that MSIE places on our ambitions. CSS Zen Garden has been a major source of ideas; I wouldn’t have been able to customise this blog as much as I have without them. I was an early adopter of the technology and got burnt by inconsistent browser support; Netscape 4 was the proverbial bête noire back then, fulfilling the role that MSIE plays today. In those days, it was the idea of controlling text display and element backgrounds from a single place that appealed. Since then, I have progressed to using CSS to replace table-based layouts and to control element positioning. It can do more…

JavaScript

Having had a JavaScript-powered photo gallery before my current Perl-driven one, I can say that I have definitely sampled this ever-pervasive scripting language. Being a client-side language rather than a server-side one, it does place you rather at the mercy of the browser purveyors, and it never ceases to amaze me that there is a buzz around AJAX because of this. In fact, the abundance of cross-browser AJAX function libraries is testimony to the need for browser-specific code. Despite my preference for server-side scripting, I still find a use for JavaScript, and its main use for me these days is to dynamically control CSS elements to do such things as control the height of a page element or whether it is shown at all. Apparently, CSS may get some dynamic capabilities in the future and reduce my dependence on JavaScript. In the meantime, Jeremy Keith’s DOM Scripting (Friends of Ed) will prove as much of an asset as it has done.

XML

These days, a lot of the raw data underlying my personal website is stored in XML. I did try to dynamically transform the display of the XML into something meaningful with CSS and XSLT when I first scaled its dizzy heights but I soon resorted to other techniques. Browser support and the complexity of what I required were the major contributors to this. The new strategy involved two different approaches. The first was to create PHP/XHTML pages from the precursor XML offline and this is how I generate the website’s directory pages. The other one is to process the XML as text to dynamically supply an XHTML page as the user visits it; this is the way that the photo gallery works.

Perl

This still powers all of my photo gallery. While thoughts of changing it all to PHP linger, there is a certain something about the Perl language that keeps it there. I suppose it is that PHP is entangled in the HTML while Perl encases the whole business, and I am reasonably familiar with its syntax these days, which is why it still does a lot of the data-processing grunt work that I need.

PHP

PHP is everywhere these days, though it doesn’t attract quite the level of hype that used to be the case. It still appears with its sidekick MySQL in many website applications. Blogging software such as WordPress and content management systems like Drupal, Mambo and Joomla! wouldn’t exist without the pair. It appears on my website as the glue that holds my visitor directories together and is the processing engine of my WordPress blog. And if I ever add a Drupal element to the site, by no means a foregone conclusion though I am spending a lot of time with it at the moment, PHP will continue its presence in my website scripting, as it powers that too.

Applications

Macromedia HomeSite

I have a liking for hand coding, so this does most of what I need. When Macromedia (itself since taken over by Adobe, of course) took over Allaire, HomeSite sadly lost its WYSIWYG capability, but the application still soldiers on even though Dreamweaver offers a lot to code cutters these days. Nevertheless, it does have certain advantages over Dreamweaver: it is a fleeter beast to start up and it colour-codes Perl syntax.

Macromedia Dreamweaver

There was a time when Dreamweaver was solely a tool for visual web page development, but the advent of Dreamweaver UltraDev added server-side development capabilities to the Dreamweaver family. These days, there is only one Dreamweaver version, but UltraDev’s capabilities still live on in the latest release and I would not be surprised if they were taken further in these database-driven times.

Nowadays, Dreamweaver isn’t an application where I spend a great deal of time. In former times, when my site was made up of static HTML pages, I used Dreamweaver a lot even if its rendering capabilities were a step behind the then-current browser versions. I suppose that it didn’t fit the way in which I worked, but its template-driven workflow would have been a boon back then.

However, my move from a static site to a dynamic one, starting with my photo gallery, has meant that I haven’t used it as much since then. That said, given my use of PHP/MySQL components on my site, its server-side abilities could yet get the level of investigation that they deserve.

Altova XMLSpy Professional

Adding MySQL databases to my web hosting costs money, not a lot but it could be spent on other (more important?) things. Hence, I use XML as the data store for my photo gallery and XML files are pre-processed into XHTML/PHP pages for my visitor directories prior to uploading onto the server.

I use XMLSpy to edit and manage the XML files that I use: its ability to view XML in grid format is a killer feature as far as I am concerned, and XML validation also proves very useful, particularly with regard to ensuring that DTDs and XML files are in step and that XSLT files are coded correctly. There are other features that I have yet to explore, ones that would also take my knowledge of XML further to boot, not at all a bad thing.

Saxon

For processing XML into another file format such as XHTML, you need an XSLT processor, and I use the free version of Saxon to do the needful; Saxonica offers commercial versions of it too. There is, I believe, a processor in XMLSpy, but I don’t use it because Saxon’s command-line interface fits better into my workflow. This is a Perl-driven process where XML files are read and XSLT files, one per XML file, are built before both are fed to Saxon for transforming into XHTML/PHP files. It all works smoothly, and updating the XML inputs is all that is required.
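
By way of illustration, the sort of Saxon invocation that this workflow produces might look like the following; the file names are hypothetical and the exact JAR name depends on the Saxon version installed:

java -jar saxon.jar -o gallery.php gallery.xml gallery.xsl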

AceFTP

If I were looking for an FTP client now, it would be FileZilla but AceFTP has served me well over the last few years and it looks as if that will continue. It does have some extra features over FileZilla: transfers between remote sites, and scheduling, for example. I have yet to use either but they look valuable.

Hutmil

In bygone days when I had loads of static HTML files, making changes was a bit of a chore if they affected every single file. An example is changing the year in the copyright message in the page footers. Hutmil, which I found on a magazine cover-mounted disc, was a great time saver in those days. Today, I achieve this by putting such information into a single file and getting Perl or PHP to import that when building the page. The same “define once, use anywhere” approach underlies CSS as well, and scripting very usefully allows you to take it into the XHTML domain.
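
As a sketch of the scripting side of this, with footer.php as a hypothetical shared file holding the copyright message, a PHP page can pull it in like so:

<?php include 'footer.php'; ?>

Change the year in that one file and every page picks it up the next time that it is built or served.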

Apache

Apache is ubiquitous these days, and both the online and offline versions of my site are powered by it. It does require some configuration, but it is a very powerful piece of kit. The introduction of 2.2.x meant a big change in the way that configuration files were modularised: while most things were contained in a single file for 2.0.x, the settings are broken up into different files in 2.2.x and it can take a while to find things again. Without having it on my home PC, I would not be able to use Perl, PHP or MySQL. Apart from this, I especially like its virtual site capability; it is very useful for offline development.
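
To give an idea of what is involved, a minimal virtual host definition along the following lines (the names are hypothetical) goes into the Apache configuration, with 2.2.x also wanting a matching NameVirtualHost directive:

NameVirtualHost *:80

<VirtualHost *:80>
    ServerName dev.mysite.local
    DocumentRoot /www/dev-site
</VirtualHost>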

WordPress

My hosting supplier offers blogs on Blogware, but that didn’t offer the level of configuration that I would have liked, which is probably true of any hosted blog service. I can’t speak for Blogger, but WordPress.com does have its restrictions too. To make my hillwalking blog fit in with the appearance of my photo gallery, I popped over to WordPress.org to download WordPress so that I could host a blog myself and have maximum control over its appearance. WordPress supports themes, so I created my own and got my blog pages looking as if they are part of my website, rather than looking like something that was bolted on. Now that I think of it, what about the hosted services supporting user-created themes? I suppose that there is the worry of insecure PHP code, but what about it?

MySQL

I am in two minds on whether this is a technology or a tool. SQL certainly would be a technology standard, but I am not so clear on what MySQL would be. In any case, I have classed it as a tool and a very useful one at that. It is the linchpin for my WordPress blogs and, if I go for a content management system like Drupal, its role would surely grow. While I do have a lot of experience with SAS SQL and this helps me to deal with other varieties, there is still a learning curve with MySQL that gets me heading for a good book, and Kofler’s The Definitive Guide to MySQL 5 (Apress) seems to perform more than adequately in this endeavour.

Paint Shop Pro

As someone who hosts an online photo gallery, it won’t come as a surprise that I have had exposure to image editors. Despite various other flirtations, Paint Shop Pro has been my tool of choice over the years, but it is now set to be usurped by a member of Adobe’s Photoshop family. Paint Shop Pro does have books devoted to it, but it seems that Photoshop gets better coverage and I feel that my image processing needs to be taken up a gear, hence the potential move to Photoshop.

Rough?

11th November 2009

Was it because Canonical and friends kept Ubuntu in such a decent state from 8.04 through to 9.04 that things went a little quiet in the blogosphere on the subject of the well-known Linux distribution? If so, 9.10 might be proving more of a talking point and you have to wonder if this is such a good thing with the appearance of Windows 7 on the scene. Looking on the bright side, 10.04 will be an LTS release so there is some chance that any rough edges that are on display now could be resolved by next April. Even so, it might have been better not to see anything so obvious at all.

In truth, Ubuntu always has had its gaps and I have seen a few of their ilk over the last two years; some have triggered postings on here. In fact, issues with accessing the BBC iPlayer still bring a goodly number of folk to this website. That may just be a matter of grabbing RealPlayer, now helpfully available as a DEB package, from the requisite place on the web and ensuring that Ubuntu-Restricted-Extras is in place too, but you have to know that in the first place. Even so, there have been unexpected behaviours, like Palimpsest seeing every partition on a disk as a different drive, and SIL RAID mappings being seen for hard drives that used to live in the main home PC that bit the dust earlier this year; the latter only happens on one of the machines that I have running Ubuntu, so it may be a hardware thing, and a newly added hard drive uses none of the SIL mappings either. Perhaps more seriously (is it something that a new user should be encountering?), a misfiring variant of Brasero had me moving to K3b. Then UFRaw was sluggish in batch mode, but that’s nothing that having a Debian VM won’t overcome. Rough edges like these do get you asking if 9.10 was ready for the big time, while making you reluctant to recommend it to mainstream users like my brother.

The counterpoint to the above is that 9.10 includes a host of under-the-bonnet changes like the introduction of Ext4 hard drive formatting, Xsplash to allow the faster system loading to occur unseen, and GNOME 2.28. To someone looking in from outside like me, that looks like a lot of work and might explain the ingress of the annoyances that I have seen. Add to that the fact that we are between Debian releases, so things like the optimised packaging of ImageMagick or UFRaw may not be so high up the list of things to do, especially with the more general speed optimisations that were put in place for 9.10. With 10.04 set to be an LTS release, I’d be hoping that consolidation is the order of the day over the next five or six months, but it seems to be the inclusion of new features and other such progress that gets magazine reviewers giving higher ratings (Linux Format has given it a mark of 9 out of 10). With the mooted inclusion of GNOME 3 and its dramatically different interface in 10.10, they should get their fill of that. However, I’d like to see some restraint for the sake of a smooth transition from the familiar GNOME 2.x to the new. If GNOME 3 stays very like its alpha builds, then the question of how users will take to it arises. Of course, there’s some time yet before we see GNOME 3 and, having seen how the Ubuntu developers transformed GNOME 2.28, I wouldn’t be surprised if the impact of any change could be dulled.

In summary, my few weeks with Ubuntu 9.10 as my main OS have thrown up no major roadblocks that would cause me to look at moving elsewhere; Fedora would be tempting if that situation were to arise. The irritations that I have seen are more like signs of a lack of polish and remain peripheral to day-to-day working, if you discount CD/DVD burning. To be honest, there always have been rough edges in Ubuntu, but has the lack of sizeable change spoilt us? Whatever about how things feel afterwards, big changes can mean new problems to resolve and inspire blog posts describing any solutions, so it’s not all bad. If that’s what Canonical wants to see, they might get it, and the year ahead looks as if it is going to be an interesting one after a recent quieter period.

Moving a website from shared hosting to a virtual private server

24th November 2018

This year has seen some optimisation being applied to my web presences, guided by the results of GTmetrix scans. It was then that I realised how slow things were, so server loads were reduced. Anything that slowed response times, such as WordPress plugins, got removed. Usage of Matomo also was curtailed in favour of Google Analytics, while HTML, CSS and JS minification followed. What had yet to happen was a search for a faster server. Now, another website has been moved onto a virtual private server (VPS) to see how that would go.

Speed was not the only consideration since security was a factor too. After all, a VPS is more locked away from other users than a folder on a shared server. There also is the added sense of control, so Let’s Encrypt SSL certificates can be added using the Electronic Frontier Foundation’s Certbot. That avoids the expense of using an SSL certificate provided through my shared hosting provider and a successful transition for my travel website may mean that this one undergoes the same move.

For the VPS, I chose Ubuntu 18.04 as its operating system, and it came with the LAMP stack already in place. Having offline development websites means that the mix of Apache, MySQL and PHP is more familiar to me than anything using Nginx or Python. It also means that .htaccess files become more useful than they were on my previous Nginx-based platform. Having full access to the operating system using SSH helps too and should mean that I have fewer calls on technical support, since I can do more for myself. Any extra tinkering should not affect others either, since this type of setup is well known to me, and having an offline counterpart means that anything riskier is tried there beforehand.

Naturally, there were niggles to overcome with the move. The first was to make the MySQL instance accept calls from outside the server so that I could migrate data there from elsewhere, and I even got my shared hosting setup to start using the new database to see what performance boost it might give. To make all this happen, I first found the location of the relevant my.cnf configuration file using the following command:

find / -name my.cnf

Once I had the right file, I commented out the following line that it contained and restarted the database service afterwards using another command to stop the appearance of any error 111 messages:

bind-address = 127.0.0.1
service mysql restart
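
Opening up the network side is only half of the job, since the connecting account also needs the privileges to come in from another host. Something like the following grant does that in MySQL 5.7, the version that Ubuntu 18.04 supplies; the user, password and database names are placeholders:

GRANT ALL PRIVILEGES ON mydb.* TO 'myuser'@'%' IDENTIFIED BY 'mypassword';
FLUSH PRIVILEGES;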

After that, things worked as required and I moved on to another matter: uploading the requisite files. That meant installing an FTP server, so I chose proftpd since I knew it well from previous tinkering. Once that was in place, file transfer commenced.
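
For completeness, getting proftpd in place on Ubuntu is as simple as the following, though the package manager may choose the proftpd-basic package on your behalf:

apt install proftpd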

When that was done, I could do some testing to see if I had an active web server that loaded the website. Along the way, I also enabled some Apache modules like mod_rewrite using the a2enmod command, restarting Apache each time I enabled another module, as shown below.
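
In case it is of use to anyone, the sequence for each module went like this; a2enmod knows the module as rewrite rather than mod_rewrite:

a2enmod rewrite
service apache2 restart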

Then, I discovered that Textpattern needed php7.2-xml installed, so the following command was executed to do this:

apt install php7.2-xml

Then, the following line was uncommented in the correct php.ini configuration file that I found using the same method as that described already for the my.cnf configuration and that was followed by yet another Apache restart:

extension=php_xmlrpc.dll

Addressing the above issues yielded enough success for me to change the IP address in my Cloudflare dashboard so it pointed at the VPS and not the shared server. The changeover happened seamlessly without having to await DNS updates as once would have been the case. It had the added advantage of making both WordPress and Textpattern work fully.

With everything working to my satisfaction, I then followed the instructions on Certbot to set up my new Let’s Encrypt SSL certificate. Aside from a tweak to a configuration file and another Apache restart, the process was more automated than I had expected, so I was ready to embark on some fine-tuning to embed the new security arrangements. That meant updating .htaccess files and Textpattern has its own, so the following addition was needed there:

RewriteCond %{HTTPS} !=on
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]

This complemented what was already in the main .htaccess file and WordPress allows you to include http(s) in the address it uses, so that was another task completed. The general .htaccess only needed the following lines to be added:

RewriteCond %{SERVER_PORT} 80
RewriteRule ^(.*)$ https://www.assortedexplorations.com/$1 [R,L]

What all these achieve is to redirect insecure connections to secure ones for every visitor to the website. After that, internal hyperlinks without https needed updating along with any forms so that a padlock sign could be shown for all pages.

With the main work completed, it was time to sort out a lingering niggle regarding the appearance of an FTP login page every time a WordPress installation or update was requested. The main solution was to make the web server account the owner of the files and directories, but the following line was added to wp-config.php as part of the fix even if it probably is not necessary:

define('FS_METHOD', 'direct');

There also was the non-operation of WP Cron and that was addressed using WP-CLI and a script from Bjorn Johansen. To make double sure of its effectiveness, the following was added to wp-config.php to turn off the usual WP-Cron behaviour:

define('DISABLE_WP_CRON', true);
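
The replacement scheduling then falls to the system crontab. The following entry is a sketch of the sort of thing at which the Bjorn Johansen approach arrives, with the path being a placeholder for wherever the WordPress installation lives:

*/10 * * * * cd /var/www/mysite && wp cron event run --due-now >/dev/null 2>&1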

Intriguingly, WP-CLI offers a long list of possible commands that are worth investigating. A few have been examined, but more await attention.

Before those, I still need to get my new VPS to send emails. So far, sendmail has been installed, the hostname changed from localhost and the server restarted. More investigations are needed, but what I have now is faster than what was there before, so the effort has been rewarded already.
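
For the record, the steps taken so far amount to something like the following, with the host name being a placeholder:

apt install sendmail
hostnamectl set-hostname vps.mydomain.com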

Online favicon.ico creation

21st January 2008

I recently updated the icon that appears beside this blog’s address in the address bar and bookmarks menus of some browsers. I gave it a go in GIMP but seemed to get no joy. I pottered out on the web to discover what I might have done wrong, only to find Dynamic Drive offering online favicon.ico creation. Out of curiosity, I decided to give the thing a whirl and download the result to upload onto my web server. GIFs, JPGs, PNGs and BMPs of less than 150 KB are accepted, and it worked for me.

Photography Kit

7th July 2008

This is a list that I want to build up over time and I am going to limit it to the U.K. for now. As should be apparent from any commentary that I have included, I have dealt with a few of the retailers that are listed below so I hope that it comes in useful.

7dayshop.com

My biggest purchase from this Guernsey-based lot was a Canon EOS 10D body that heralded the start of my journey into the world of digital photography at the beginning of 2005. There was a time when I was wont to buy film from them too, along with other bits and pieces, but I then turned to Mailshots in Stoke-on-Trent for similar pricing and quicker delivery; it often took weeks for things to arrive from Guernsey after purchase.

Ace Optics

Cameraworld

Ffordes

Prior to my entry into the world of digital photography, this lot became a port of call for several pre-owned film cameras. A Minolta X-700 came from there in 2002, as did compatible Sigma lenses and a flash gun. During 2004, I traded in my Canon EOS 300 for an EOS 30 that they had on sale, and an EOS 50E was acquired as a second body. A moment of foolishness resulting from a lapse of concentration while on a visit to Harris in August has meant that the 50E has been pressed into service as my main film camera on any outings; it’s always good to have a spare, and prices these days are more tempting than when I was buying second-hand equipment.

Jessops

This is a name in photographic retailing that has been brought back from the dead. Before its collapse, it was the major retailer in Britain’s town centres and there was a branch in Macclesfield. However, the focus is more on online sales now, with only a small network of city centre stores like the one on Market Street in Manchester. Having Jessops back is no bad thing and I wish them well, for it was at a branch in Stockport that I bought my first-ever SLR, a Canon EOS 300, in July 2001. Purchases of Sigma lenses followed: a 70-300 mm one in Stockport and a 28-135 mm one in Manchester. Admittedly, the latter of these saw more use than the former, but that always happens to me: I seem to be a one body, one lens man most of the time and it is only the prospect of a loss in quality that keeps me away from using super-zoom lenses.

London Camera Exchange

Mifsuds

Park Cameras

It seems to have been Sigma lenses for my Pentax DSLRs that I have been buying from these people. The first was an 18-125 mm offering that is the main one that I use, and next came a 50-200 mm one that extends my photographic range further into the telephoto region. That I made the second purchase from them may surprise some, given that there was a lengthy wait for the first one, but I may have asked for a less common item and I allowed for this. The 50-200 mm lens was a far more timely arrival, and there may be more purchases from them yet, subject to my actually having a need to do so.

Picstop

A card reader and SD cards have been what makes up the custom that I have given this bunch. Delivery from the Isle of Man is quicker than from Jersey but you do incur additional charges even if you get that for which you are paying.

SRS Microsystems

Wex Photo Video

Formerly known as Warehouse Express, this operation has occasionally tempted me with promising goods at appealing prices. In the early days, a Sekonic light meter came from them but they now are a first port of call when pondering the prospect of a photographic purchase. Various cameras, lenses, filters and bags have been sourced there over the years.

Camera tales

20th January 2007

Anyone who has ever been on HennessyBlog will know that I enjoy walking in the countryside and that I always have a camera with me when I do. Like many digital SLR owners, I am beginning to see the tell-tale spots in my photos that are caused by a dirty sensor. And it isn’t that I am continually changing lenses either: I rarely remove the Sigma 18-50 mm DC zoom lens that I use with the camera. Rather than trusting myself with the cleaning (I have had a go already without much success), I am giving serious consideration to letting the professionals take care of my Canon EOS 10D, my only digital camera. I have already been quoted something of the order of £35 by a Canon service centre not far from me and am seriously considering taking them up on the offer.

Of course, sending it away to them means that I will have to forgo the ability to include photos with posts on HennessyBlog describing my walks in the kind of timescale to which I have become accustomed; of course, this is where digital really scores. I will still have a camera with me as film remains my mainstay, even in this digital age. The camera in question is another Canon, an EOS 30 that I acquired used from Ffordes Photographic. While taking a recent peek at their website, I spied a used EOS 1V going for £399, a song for what remains Canon’s top-of-the-range film SLR. Yes, I am tempted, but I must stay real. In fact, I did not pay full price for my EOS 10D either. It was part of the run-out stock that 7dayshop.com was selling off at close to half of the EOS 10D’s original asking price in the wake of its being superseded by the EOS 20D (itself since replaced by the EOS 30D: digital is a fast-moving world).

Sending a camera away for attention is not new to me, as I also acquired a used Minolta X-700 manual focus SLR, again obtained from Ffordes, and that needed a spot of maintenance after a year in my possession. There was a problem with the shutter that cost me £75 to get Minolta to fix. Now that Minolta as a camera maker is no more, I was wondering who would attend to it in the future. That question was answered by a recent look on the web: in the UK, it is JP Service Solutions, a division of Johnson’s Phototopia; Konica Minolta retains this information on its website. Konica Minolta’s failure to capitalise effectively on the digital revolution in its early days, particularly in the SLR area where they gifted their competitors a massive head start, cost them their future in the photographic business and now Sony carries the mantle, a sad end to one of camera manufacturing’s great innovators.

Returning to my digital-less dilemma, I suppose that I could get another digital for backup duty; I have to admit that a DSLR is a bulky contraption to be carrying in airline luggage. The camera that has made it onto my wish list is Ricoh’s GR Digital, a highly regarded offering that follows in the great tradition of its film forbears, the GR 1 and GR 21. Given that my first 35 mm camera was a Ricoh, and I have it still, this would be a case of returning to my roots. Of course, having it on a wish list is very different from having it on the to-do list and finances will certainly dictate if the purchase is made, though a finance deal offered by Warehouse Express does make it more accessible. Maybe some day…

Ricoh GR Digital

A case of “peekaboo” behaviour in Internet Explorer

1st July 2008

Recently, I changed the engine of my online photo gallery to a speedier PHP/MySQL-based affair from its PHP/Perl/XML-powered predecessor. On the server side, all was well, but a peculiar display issue turned up in Internet Explorer (6, 7 & 8 were afflicted by this behaviour) where photo caption text on the thumbnail gallery pages was being displayed erratically.

As far as I can gather, the trigger for the behaviour was that the thumbnail block was placed within a DIV floated using CSS, and this touched another DIV that cleared the floating behaviour. I use a table to hold the images and their associated captions in place. Furthermore, each caption was also a hyperlink nested within a set of P tags.

The remedy was to set the CSS display property for the affected XHTML tag to a value of “inline-block”. Within a DIV, TABLE, TR, TD, P and A tag hierarchy, finding the right tag where the CSS property in question has the desired effect took some doing. As it happened, it was the tag at the bottom of the stack, that for the hyperlink, that needed the fix.
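
In CSS terms, the fix reduced to a declaration like the one below. The selector is hypothetical, since the real one depends on the class names in the gallery markup, but the display declaration is the operative part:

div.thumbnails td p a { display: inline-block; }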

Of course, it’s all very fine fixing something for one browser but it’s worthless if it breaks the presentation in other browsers. In that vein, I did some testing in Opera, Firefox, Seamonkey and Safari to check if all was well and it was. There may be older browsers, like versions of IE prior to 6, where things don’t appear as intended but I get the impression from my visitor statistics that the newer variants hold sway anyway. All in all, it was a useful lesson learnt and that’s never a bad thing.

About workspaces…

16th November 2007

One of the nice things about the world of Linux and UNIX is the availability of multiple workspaces. In Windows, you only ever get one, and the likes of me can easily fill up that task bar. So the idea of parcelling off different applications to different screens is useful from a housekeeping point of view, so long as icons only appear in the task bar for the open workspace; Ubuntu respects this but openSUSE doesn’t, a possible source of irritation.

However, a case can be made that UNIX/Linux needs workspaces more than Windows because of the multi-window interfaces of some of its software applications. The trouble with each of these sub-windows is that an entry appears in the task bar for each of them, creating a mess very quickly. It can also be an issue working out which window closes the lot.

Examples of the above that come to my mind include GIMP, XSane and SAS. The Windows version of the latter’s DMS is confined to a single application window, while the UNIX incarnation is composed of a window each for individual components like the program editor, log, output, etc. Typing "bye" on the command line of the program editor is enough to dispatch the GUI. With GIMP, Ctrl+Q will close it down from any window apart from the "Tip of the Day" one that pops up when GIMP is first started. The same sort of behaviour seems to dispatch XSane too.

Switching from one workspace to another is as easy as clicking the relevant icon in the task bar in all of the UNIX variants that I have used. Switching an application from one workspace to another has another common thread: finding the required entry in the application window menu.

In Ubuntu, I have seen other ways of working with workspaces. In the interface with visual effects turned off, hovering over the workspace icons in the task bar allows you to move from one to another with the wheel of your mouse. Moving an application between workspaces can be done as simply as dragging boxes from one task bar icon to another. Turning on the visual effects changes things, though. It might appear that the original functionality still works but that seems not to be the case: a matter for Canonical to resolve, perhaps?

The visual effects do provide other ways around this though. Keeping all your application windows minimised means that you can run through workspaces themselves with your wheel mouse. Moving applications between workspaces becomes as simple as grabbing the title bar and pulling the window left or right until it changes workspace. Be careful that you do the job fully though or you could have an application sitting astride two workspaces. It would appear that ideas from the sharing of a desktop across multiple monitors have percolated through to workspace behaviour.

Aside (regarding Ubuntu visual effects): I don’t know who came up with the idea of having windows wobble when they’re being moved around but it certainly is unusual, as is seeing what happens when you try prising a docked window from its mooring (particularly when you’re pulling it up from the bottom task bar). The sharper font display and bevelled screen furniture make more sense to me though; they certainly make a UI more appealing and modern.
